Upgraded radar can enable self-driving cars to see clearly no matter the weather
- Date: November 17, 2020
- Source: University of California - San Diego
- Summary: A new kind of radar could make it possible for self-driving cars to navigate safely in bad weather. Electrical engineers developed a clever way to improve the imaging capability of existing radar sensors so that they accurately predict the shape and size of objects in the scene. The system worked well when tested at night and in foggy conditions.
A new kind of radar could make it possible for self-driving cars to navigate safely in bad weather. Electrical engineers at the University of California San Diego developed a clever way to improve the imaging capability of existing radar sensors so that they accurately predict the shape and size of objects in the scene. The system worked well when tested at night and in foggy conditions.
The team will present their work at the ACM SenSys conference, Nov. 16 to 19.
Inclement weather conditions pose a challenge for self-driving cars. These vehicles rely on technology like LiDAR and radar to "see" and navigate, but each has its shortcomings. LiDAR, which works by bouncing laser beams off surrounding objects, can paint a high-resolution 3D picture on a clear day, but it cannot see in fog, dust, rain or snow. On the other hand, radar, which transmits radio waves, can see in all weather, but it only captures a partial picture of the road scene.
Enter a new UC San Diego technology that improves how radar sees.
"It's a LiDAR-like radar," said Dinesh Bharadia, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering. It's an inexpensive approach to achieving bad weather perception in self-driving cars, he noted. "Fusing LiDAR and radar can also be done with our techniques, but radars are cheap. This way, we don't need to use expensive LiDARs."
The system consists of two radar sensors placed on the hood and spaced an average car's width apart (1.5 meters). Having two radar sensors arranged this way is key -- they enable the system to see more space and detail than a single radar sensor.
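To make the idea concrete, here is a minimal sketch of the merge step in Python, assuming two forward-facing radars that each report 2D (x, y) detections in their own coordinate frame. The mounting details, names, and frame conventions below are illustrative assumptions, not specifics from the paper.

```python
import numpy as np

# Illustrative mounting geometry (not from the paper): two forward-facing
# radars on the hood, 1.5 m apart, i.e. 0.75 m either side of the centerline.
BASELINE = 1.5  # meters between the two sensors
RADAR_OFFSETS = {
    "left":  np.array([-BASELINE / 2, 0.0]),
    "right": np.array([+BASELINE / 2, 0.0]),
}

def to_vehicle_frame(points, radar_id):
    """Translate one radar's (x, y) detections into the shared vehicle frame.

    points is an (N, 2) array in the radar's own frame, with x lateral and
    y pointing down the road. Both radars are assumed to face straight
    ahead, so a pure translation suffices; a tilted mounting would also
    need a rotation.
    """
    return points + RADAR_OFFSETS[radar_id]

def merge_point_clouds(left_points, right_points):
    """Stack both sensors' sparse detections into one denser cloud."""
    return np.vstack([
        to_vehicle_frame(left_points, "left"),
        to_vehicle_frame(right_points, "right"),
    ])
```

Because both clouds land in the same vehicle frame, every downstream step sees roughly twice as many points per object as a single radar would provide.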
During test drives on clear days and nights, the system performed as well as a LiDAR sensor at determining the dimensions of cars moving in traffic. Its performance did not change in tests simulating foggy weather. The team "hid" another vehicle using a fog machine and their system accurately predicted its 3D geometry. The LiDAR sensor essentially failed the test.
Two eyes are better than one
Radar traditionally suffers from poor imaging quality because when radio waves are transmitted and bounce off objects, only a small fraction of the signal is ever reflected back to the sensor. As a result, vehicles, pedestrians and other objects appear as a sparse set of points.
"This is the problem with using a single radar for imaging. It receives just a few points to represent the scene, so the perception is poor. There can be other cars in the environment that you don't see," said Kshitiz Bansal, a computer science and engineering Ph.D. student at UC San Diego. "So if a single radar is causing this blindness, a multi-radar setup will improve perception by increasing the number of points that are reflected back."
The team found that spacing two radar sensors 1.5 meters apart on the hood of the car was the optimal arrangement. "By having two radars at different vantage points with an overlapping field of view, we create a region of high-resolution, with a high probability of detecting the objects that are present," Bansal said.
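That high-resolution region is simply the geometric overlap of the two wedge-shaped fields of view. Here is a small sketch of the overlap check, assuming an illustrative 120-degree field of view per radar; the actual sensor specifications are not given in the article.

```python
import numpy as np

# Illustrative geometry (not published specs): radars 1.5 m apart on the
# hood, each with a 120-degree field of view, both facing forward (+y).
RADAR_POSITIONS = [np.array([-0.75, 0.0]), np.array([0.75, 0.0])]
FOV_HALF_ANGLE = np.deg2rad(60.0)

def in_fov(point, radar_pos):
    """True if a vehicle-frame (x, y) point lies inside one radar's wedge of view."""
    rel = point - radar_pos
    if rel[1] <= 0:                        # behind the sensor
        return False
    bearing = np.arctan2(rel[0], rel[1])   # angle off the forward (+y) axis
    return abs(bearing) <= FOV_HALF_ANGLE

def in_high_resolution_region(point):
    """Points visible to both radars fall in the overlapping, high-detail region."""
    return all(in_fov(point, pos) for pos in RADAR_POSITIONS)
```

Any detection that falls in this overlap can, in principle, be corroborated by both sensors, which is exactly what the denoising step described below exploits.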
A tale of two radars
The system overcomes another problem with radar: noise. It is common to see random points, which do not belong to any objects, appear in radar images. The sensor can also pick up what are called echo signals -- reflections of radio waves that reach the sensor indirectly rather than straight from the objects being detected.
More radars mean more noise, Bharadia noted. So the team developed new algorithms that fuse the information from the two radar sensors to produce a new, noise-free image.
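One simple way to exploit that redundancy is to keep only detections corroborated by both sensors: a real object in the overlapping field of view shows up in both point clouds at nearly the same place, while random noise and stray echoes rarely line up across two vantage points. The sketch below implements that heuristic; it is an illustrative stand-in, not the team's published fusion algorithm, and the matching tolerance is an assumed value.

```python
import numpy as np

def cross_validate(points_a, points_b, tol=0.5):
    """Keep only points in points_a that have a nearby match in points_b.

    points_a and points_b are (N, 2) and (M, 2) arrays of vehicle-frame
    (x, y) detections from the two radars. tol is an assumed matching
    radius in meters.
    """
    if len(points_b) == 0:
        return np.empty((0, 2))
    # Pairwise distances between every point in A and every point in B.
    dists = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=2)
    return points_a[dists.min(axis=1) < tol]

def denoise(points_left, points_right, tol=0.5):
    """Fuse the two clouds, keeping only mutually corroborated detections."""
    return np.vstack([
        cross_validate(points_left, points_right, tol),
        cross_validate(points_right, points_left, tol),
    ])
```

A point seen by only one sensor, whether a random artifact or an indirect echo, has no counterpart in the other cloud and is discarded, while genuine objects in the overlap region survive from both viewpoints.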
Another innovation of this work is that the team constructed the first dataset combining data from two radars.
"There are currently no publicly available datasets with this kind of data, from multiple radars with an overlapping field of view," Bharadia said. "We collected our own data and built our own dataset for training our algorithms and for testing."
The dataset consists of 54,000 radar frames of driving scenes during the day and night in live traffic, and in simulated fog conditions. Future work will include collecting more data in the rain. To do this, the team will first need to build better protective covers for their hardware.
The team is now working with Toyota to fuse the new radar technology with cameras. The researchers say this radar-plus-camera combination could replace LiDAR. "Radar alone cannot tell us the color, make or model of a car. These features are also important for improving perception in self-driving cars," Bharadia said.
Video: https://www.youtube.com/watch?v=5BrC0Jt4xUc&feature=emb_logo
Story Source:
Materials provided by University of California - San Diego. Note: Content may be edited for style and length.